In statistics, the Rao–Blackwell theorem, sometimes referred to as the Rao–Blackwell–Kolmogorov theorem, is a result that characterizes the transformation of an arbitrarily crude estimator into an estimator that is optimal by the mean-squared-error criterion or any of a variety of similar criteria.

The Rao–Blackwell theorem states that if ''g''(''X'') is any kind of estimator of a parameter θ, then the conditional expectation of ''g''(''X'') given ''T''(''X''), where ''T'' is a sufficient statistic, is typically a better estimator of θ, and is never worse. Sometimes one can very easily construct a very crude estimator ''g''(''X''), and then evaluate that conditional expected value to get an estimator that is in various senses optimal.

The theorem is named after Calyampudi Radhakrishna Rao and David Blackwell. The process of transforming an estimator using the Rao–Blackwell theorem is sometimes called Rao–Blackwellization. The transformed estimator is called the Rao–Blackwell estimator.

==Definitions==
*An estimator δ(''X'') is an ''observable'' random variable (i.e. a statistic) used for estimating some ''unobservable'' quantity. For example, one may be unable to observe the average height of ''all'' male students at the University of X, but one may observe the heights of a random sample of 40 of them. The average height of those 40 (the "sample average") may be used as an estimator of the unobservable "population average".
*A sufficient statistic ''T''(''X'') is a statistic calculated from data ''X'' such that the conditional probability distribution of all the observable data ''X'', given ''T''(''X''), does not depend on the ''unobservable'' parameter θ (such as the mean or standard deviation of the whole population from which the data ''X'' were taken). Consequently, no other statistic that can be calculated from the data ''X'' provides any additional information about θ. In the most frequently cited examples, the "unobservable" quantities are parameters that parametrize a known family of probability distributions according to which the data are distributed.
*A Rao–Blackwell estimator δ1(''X'') of an unobservable quantity θ is the conditional expected value E(δ(''X'') | ''T''(''X'')) of some estimator δ(''X'') given a sufficient statistic ''T''(''X''). Call δ(''X'') the "original estimator" and δ1(''X'') the "improved estimator". It is important that the improved estimator be ''observable'', i.e. that it does not depend on θ. In general, the conditional expected value of one function of the data given another function of the data ''does'' depend on θ, but the definition of sufficiency given above entails that this one does not.
*The ''mean squared error'' of an estimator is the expected value of the square of its deviation from the unobservable quantity being estimated; the theorem's guarantee is stated in terms of this criterion below.
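In terms of the definitions above, the theorem's mean-squared-error guarantee can be written as a single inequality (a standard statement of the result, added here for clarity):

:<math>\operatorname{E}_\theta\!\left[\left(\delta_1(X) - \theta\right)^2\right] \;\le\; \operatorname{E}_\theta\!\left[\left(\delta(X) - \theta\right)^2\right] \quad\text{for every value of }\theta,</math>

where δ1(''X'') = E(δ(''X'') | ''T''(''X'')) is the improved estimator. The inequality is a consequence of Jensen's inequality (equivalently, of the law of total variance): conditioning on ''T''(''X'') leaves the expected value of the estimator unchanged, so the bias is the same, while the variance can only decrease. Equality holds exactly when the original estimator is already (almost surely) a function of ''T''(''X'').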
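As a concrete illustration of Rao–Blackwellization, consider the classic Poisson example (the example and the simulation below are an illustrative sketch, not part of the article's text): suppose the data are ''n'' independent Poisson(θ) observations and the quantity to be estimated is p = exp(−θ), the probability that an observation equals zero. A crude but unbiased estimator is δ(''X'') = 1 if the first observation is zero and 0 otherwise. Conditioning on the sufficient statistic ''T'' = ''X''1 + ... + ''Xn'' gives the improved estimator (1 − 1/''n'')^''T'', because the conditional distribution of ''X''1 given ''T'' = ''t'' is binomial(''t'', 1/''n''), so P(''X''1 = 0 | ''T'' = ''t'') = (1 − 1/''n'')^''t''. The following Python sketch (variable names are illustrative) compares the two estimators by simulation:

<syntaxhighlight lang="python">
# A minimal simulation sketch of Rao–Blackwellization (illustrative assumptions):
# X_1, ..., X_n are i.i.d. Poisson(theta); the target is p = P(X = 0) = exp(-theta).
# Crude estimator:       delta(X)   = 1 if X_1 == 0 else 0   (unbiased but noisy)
# Sufficient statistic:  T(X)       = X_1 + ... + X_n
# Improved estimator:    delta_1(X) = E[delta | T] = (1 - 1/n)^T
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 2.0, 10, 100_000
p_true = np.exp(-theta)

X = rng.poisson(theta, size=(trials, n))
crude = (X[:, 0] == 0).astype(float)       # delta(X): uses only one observation
T = X.sum(axis=1)                          # sufficient statistic for theta
improved = (1.0 - 1.0 / n) ** T            # E[delta | T], known in closed form here

print("MSE of crude estimator:   ", np.mean((crude - p_true) ** 2))
print("MSE of improved estimator:", np.mean((improved - p_true) ** 2))
</syntaxhighlight>

Both estimators are unbiased, and the improved one depends on the data only through the sufficient statistic, as the theorem requires; running the sketch should show a substantially smaller mean squared error for the improved estimator.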